8 research outputs found

    Atomic norm denoising with applications to line spectral estimation

    Motivated by recent work on atomic norms in inverse problems, we propose a new approach to line spectral estimation that provides theoretical guarantees for the mean-squared-error (MSE) performance in the presence of noise and without knowledge of the model order. We propose an abstract theory of denoising with atomic norms and specialize this theory to provide a convex optimization problem for estimating the frequencies and phases of a mixture of complex exponentials. We show that the associated convex optimization problem can be solved in polynomial time via semidefinite programming (SDP). We also show that the SDP can be approximated by an l1-regularized least-squares problem that achieves nearly the same error rate as the SDP but can scale to much larger problems. We compare both the SDP and l1-based approaches with classical line spectral analysis methods and demonstrate that the SDP outperforms the l1 optimization, which in turn outperforms MUSIC, Cadzow's, and Matrix Pencil approaches in terms of MSE over a wide range of signal-to-noise ratios.
    Comment: 27 pages, 10 figures. A preliminary version of this work appeared in the Proceedings of the 49th Annual Allerton Conference in September 2011. Numerous numerical experiments were added to this version in accordance with suggestions by an anonymous reviewer.
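
    As a rough illustration of the SDP mentioned in the abstract, the sketch below sets up an atomic-norm soft-thresholding problem in CVXPY for a noisy sample vector y. It is a minimal, hedged example, not the authors' reference code: the function name ast_denoise, the explicit Toeplitz equality constraints, and the choice of regularization weight tau are assumptions made for the demo.

```python
import numpy as np
import cvxpy as cp

def ast_denoise(y, tau):
    """Minimal sketch of an atomic-norm soft-thresholding SDP (illustrative only).

    Solves  minimize_x  0.5*||y - x||_2^2 + tau * ||x||_A  using the semidefinite
    characterization of the atomic norm:
        ||x||_A = inf { tr(T)/(2n) + t/2 : [[T, x], [x^H, t]] >= 0, T Hermitian Toeplitz }.
    """
    n = len(y)
    Z = cp.Variable((n + 1, n + 1), hermitian=True)     # block matrix [[T, x], [x^H, t]]
    T, x, t = Z[:n, :n], Z[:n, n], cp.real(Z[n, n])
    constraints = [Z >> 0]
    # Constrain the upper-left block to be Toeplitz (constant along diagonals).
    for i in range(n - 1):
        for j in range(n - 1):
            constraints.append(T[i, j] == T[i + 1, j + 1])
    objective = 0.5 * cp.sum_squares(y - x) + tau * (cp.real(cp.trace(T)) / (2 * n) + t / 2)
    cp.Problem(cp.Minimize(objective), constraints).solve()
    return x.value
```

    The l1-regularized least-squares approximation mentioned at the end of the abstract replaces this PSD block with an explicit dictionary on a frequency grid; a sketch of that route appears under the discretization entry below.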

    Sparse recovery over continuous dictionaries-just discretize.

    In many applications of sparse recovery, the signal has a sparse representation only with respect to a continuously parameterized dictionary. Although atomic norm minimization provides a general framework to handle sparse recovery over continuous dictionaries, the computational aspects largely remain unclear. By establishing various convergence results as the discretization gets finer, we promote discretization as a universal and effective way to approximately solve the atomic norm minimization problem, especially when the dimension of the parameter space is low.
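
    As a concrete illustration of the discretize-then-solve route, the sketch below builds a dictionary of sampled complex exponentials on a uniform frequency grid and solves the resulting l1-regularized least-squares problem with CVXPY. The signal, the 1024-point grid, and the regularization weight lam are assumptions for the demo; refining the grid corresponds to the convergence results discussed above.

```python
import numpy as np
import cvxpy as cp

# Illustrative sketch of sparse recovery over a discretized frequency dictionary.
n, grid_size, lam = 64, 1024, 2.0
rng = np.random.default_rng(0)
true_freqs = np.array([0.10, 0.13, 0.40])
true_amps = np.array([1.0, 0.8, 0.5])
t = np.arange(n)
y = np.exp(2j * np.pi * np.outer(t, true_freqs)) @ true_amps
y += 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))   # noisy samples

freqs = np.arange(grid_size) / grid_size                  # uniform grid on [0, 1)
F = np.exp(2j * np.pi * np.outer(t, freqs))               # discretized dictionary

c = cp.Variable(grid_size, complex=True)                  # coefficients on the grid
objective = 0.5 * cp.sum_squares(y - F @ c) + lam * cp.norm(c, 1)
cp.Problem(cp.Minimize(objective)).solve()

support = np.flatnonzero(np.abs(c.value) > 1e-3)          # active grid frequencies
print("recovered grid frequencies:", freqs[support])
```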

    Near Minimax Line Spectral Estimation

    This paper establishes a nearly optimal algorithm for estimating the frequencies and amplitudes of a mixture of sinusoids from noisy equispaced samples. We derive our algorithm by viewing line spectral estimation as a sparse recovery problem with a continuous, infinite dictionary. We show how to compute the estimator via semidefinite programming and provide guarantees on its mean-square error rate. We derive a complementary minimax lower bound on this estimation rate, demonstrating that our approach nearly achieves the best possible estimation error. Furthermore, we establish bounds on how well our estimator localizes the frequencies in the signal, showing that the localization error tends to zero as the number of samples grows. We verify our theoretical results in an array of numerical experiments, demonstrating that the semidefinite programming approach outperforms two classical spectral estimation techniques.
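
    The frequency estimates themselves have to be read off from the denoised signal. The snippet below is a crude, hedged stand-in for that localization step, not the dual-polynomial procedure analyzed in the paper: it correlates a denoised vector x_hat with atoms on a fine grid and keeps the k largest well-separated peaks. The helper name localize_freqs and the minimal-separation rule are assumptions.

```python
import numpy as np

def localize_freqs(x_hat, k, grid_size=4096):
    """Crude frequency localization by peak-picking on a fine grid (illustrative
    stand-in, not the paper's dual-certificate procedure)."""
    n = len(x_hat)
    grid = np.arange(grid_size) / grid_size
    atoms = np.exp(2j * np.pi * np.outer(np.arange(n), grid))   # columns are atoms a(f)
    corr = np.abs(atoms.conj().T @ x_hat) / n                   # |<a(f), x_hat>| / n
    order = np.argsort(corr)[::-1]
    picked = []
    for idx in order:
        # enforce a minimal wrap-around separation of 1/n between picked frequencies
        if all(min(abs(grid[idx] - grid[j]), 1 - abs(grid[idx] - grid[j])) > 1.0 / n
               for j in picked):
            picked.append(idx)
        if len(picked) == k:
            break
    return np.sort(grid[picked])
```

    For example, feeding the output of the SDP sketch under the first entry into localize_freqs(x_hat, k=3) yields three frequency estimates that can then be compared against classical techniques such as MUSIC or the matrix pencil method.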

    Sketching Sparse Matrices

    This paper considers the problem of recovering an unknown sparse p × p matrix X from an m × m matrix Y = AXB^T, where A and B are known m × p matrices with m ≪ p. The main result shows that there exist constructions of the "sketching" matrices A and B so that even if X has O(p) non-zeros, it can be recovered exactly and efficiently using a convex program as long as these non-zeros are not concentrated in any single row/column of X. Furthermore, it suffices for the size of Y (the sketch dimension) to scale as m = O(√(# non-zeros in X) · log p). The results also show that the recovery is robust and stable in the sense that if X is equal to a sparse matrix plus a perturbation, then the convex program we propose produces an approximation with accuracy proportional to the size of the perturbation. Unlike traditional results on sparse recovery, where the sensing matrix produces independent measurements, our sensing operator is highly constrained (it assumes a tensor product structure). Therefore, proving recovery guarantees requires non-standard techniques. Indeed our approach relies on a novel result concerning tensor products of bipartite graphs, which may be of independent interest. This problem is motivated by the following application, among others. Consider a p × n data matrix D, consisting of n observations of p variables. Assume that the correlation matrix X := DD^T is (approximately) sparse in the sense that each of the p variables is significantly correlated with only a few others. Our results show that these significant correlations can be detected even if we have access to only a sketch of the data matrix D.
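
    A minimal numerical sketch of the recovery program described above, under illustrative assumptions (small sizes p = 40, m = 20, and Gaussian sketching matrices rather than the structured constructions analyzed in the paper): recover X from the two-sided sketch Y = A X B^T by minimizing the elementwise l1 norm subject to the sketch constraint.

```python
import numpy as np
import cvxpy as cp

# Illustrative sketch of recovering a sparse matrix from a two-sided sketch.
# Gaussian A, B and the sizes below are assumptions made for the demo.
p, m, nnz = 40, 20, 30
rng = np.random.default_rng(1)
X_true = np.zeros((p, p))
X_true.flat[rng.choice(p * p, size=nnz, replace=False)] = rng.standard_normal(nnz)

A = rng.standard_normal((m, p)) / np.sqrt(m)
B = rng.standard_normal((m, p)) / np.sqrt(m)
Y = A @ X_true @ B.T                                   # the m-by-m sketch

X = cp.Variable((p, p))
problem = cp.Problem(cp.Minimize(cp.norm(cp.vec(X), 1)), [A @ X @ B.T == Y])
problem.solve()
print("relative recovery error:",
      np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))
```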

    Compressed Sensing Off the Grid
